Generative Analogies as Mental Models
Author
Abstract
When people are reasoning about an unfamiliar domain, they often appear to use analogies, as in the above example from a protocol. Analogies are also used in teaching, as in the following excerpt: The idea that electricity flows as water does is a good analogy. Picture the wires as pipes carrying water (electrons). Your wall plug is a high-pressure source which you can tap simply by inserting a plug. The plug has two prongs--one to take the flow to the lamp, radio, or air conditioner, the second to conduct the flow back to the wall. A valve (switch) is used to start or stop flow.
Similar Resources
Sampling Generative Networks: Notes on a Few Effective Techniques
We introduce several techniques for effectively sampling and visualizing the latent spaces of generative models. Replacing linear interpolation with spherical linear interpolation (slerp) prevents diverging from a model’s prior distribution and produces sharper samples. J-Diagrams and MINE grids are introduced as visualizations of manifolds created by analogies and nearest neighbors. We demonst...
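The slerp technique described in this abstract can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's own code; the function name and the lerp fallback for nearly parallel vectors are my own choices:

```python
import numpy as np

def slerp(p0, p1, t):
    """Spherical linear interpolation between latent vectors p0 and p1, t in [0, 1]."""
    # Angle between the two vectors, computed from their unit directions.
    omega = np.arccos(np.clip(
        np.dot(p0 / np.linalg.norm(p0), p1 / np.linalg.norm(p1)), -1.0, 1.0))
    so = np.sin(omega)
    if np.isclose(so, 0.0):
        # Nearly parallel vectors: fall back to ordinary linear interpolation.
        return (1.0 - t) * p0 + t * p1
    # Weights follow the great-circle arc, so interpolants keep a typical
    # norm instead of cutting through low-norm regions the prior rarely
    # produces, as straight-line interpolation does in high dimensions.
    return (np.sin((1.0 - t) * omega) / so) * p0 + (np.sin(t * omega) / so) * p1
```

The motivation is that samples from a high-dimensional Gaussian prior concentrate near a spherical shell, so the straight line between two latent codes passes through regions the model was never trained on.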
Beyond Candidate Inferences: People Treat Analogies as Probabilistic Truths
People use analogies for many cognitive purposes such as building mental models, making inspired guesses, and extracting relational structure. Here we examine whether and how analogies may have more direct influence on knowledge: Do people treat analogies as probabilistically true explanations for uncertain propositions? We report an experiment that explores how a suggested analogy can influenc...
Generating Sentences by Editing Prototypes
We propose a new generative model of sentences that first samples a prototype sentence from the training corpus and then edits it into a new sentence. Compared to traditional models that generate from scratch either left-toright or by first sampling a latent sentence vector, our prototype-then-edit model improves perplexity on language modeling and generates higher quality outputs according to ...
Random Walks on Discourse Spaces: A New Generative Language Model with Applications to Semantic Word Embeddings
Semantic word embeddings represent the meaning of a word via a vector, and are created by diverse methods such as Latent Semantic Analysis (LSA), generative text models such as topic models, matrix factorization, neural nets, and energy-based models. Many methods apply nonlinear operations, such as Pointwise Mutual Information (PMI), to co-occurrence statistics, and have hand-tuned hyperparameter...
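The PMI transform of co-occurrence statistics mentioned in this abstract is easy to state concretely. Below is a minimal sketch, not code from the paper; the smoothing constant `eps` is an assumption added for numerical safety:

```python
import numpy as np

def pmi_matrix(counts, eps=1e-12):
    """Pointwise mutual information from a word-context co-occurrence matrix.

    counts[i, j] is the number of times word i co-occurs with context j.
    PMI(w, c) = log( p(w, c) / (p(w) * p(c)) ).
    """
    total = counts.sum()
    p_wc = counts / total                  # joint distribution over (word, context)
    p_w = p_wc.sum(axis=1, keepdims=True)  # word marginals
    p_c = p_wc.sum(axis=0, keepdims=True)  # context marginals
    return np.log((p_wc + eps) / (p_w * p_c + eps))
```

The logarithm is the nonlinearity the abstract refers to: it turns ratios of probabilities into additive scores, which is what makes PMI-transformed counts amenable to low-rank factorization into embeddings.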
The Riemannian Geometry of Deep Generative Models
Deep generative models learn a mapping from a lowdimensional latent space to a high-dimensional data space. Under certain regularity conditions, these models parameterize nonlinear manifolds in the data space. In this paper, we investigate the Riemannian geometry of these generated manifolds. First, we develop efficient algorithms for computing geodesic curves, which provide an intrinsic notion...
Journal title:
Volume Issue
Pages -
Publication date: 2005